Gaussian Process Classification and its PAC-Bayes Generalization Error Bounds – CSE 291 Project Report
Author
Abstract
McAllester's PAC-Bayes theorem (strengthened by [4]) characterizes the convergence of a stochastic classifier's empirical error to its generalization error. Given a fixed "prior" distribution P(h) over the hypothesis space H, the theorem holds for all "posterior" distributions Q(h) over H simultaneously, so in practice we can choose a data-dependent posterior distribution over H as the distribution of the final stochastic classifier.
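For concreteness, one common statement of the theorem (the Langford–Seeger form; the exact constants vary with the version cited, and the symbols below are notation introduced here, not taken from the report) reads: with probability at least 1 − δ over an i.i.d. sample S of size m, simultaneously for all posteriors Q over H,

```latex
\[
  \mathrm{kl}\!\left(\widehat{\mathrm{err}}_S(Q) \,\middle\|\, \mathrm{err}_D(Q)\right)
  \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m},
\]
where $\widehat{\mathrm{err}}_S(Q)$ and $\mathrm{err}_D(Q)$ are the empirical and true
Gibbs errors of $Q$, and
\[
  \mathrm{kl}(q \,\|\, p) \;=\; q \ln\frac{q}{p} + (1-q) \ln\frac{1-q}{1-p}
\]
is the binary KL divergence.
\]
```

Note that P must be fixed before seeing the data, while Q may depend on it; the KL(Q‖P) term is what makes a well-chosen prior tighten the bound.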
Similar References
PAC-Bayesian Theorems for Gaussian Process Classification
We present distribution-free generalization error bounds which apply to a wide class of approximate Bayesian Gaussian process classification (GPC) techniques, powerful nonparametric learning methods similar to Support Vector Machines. The bounds use the PAC-Bayesian theorem [8], for which we provide a simplified proof, leading to new insights into its relation to traditional VC-type union bound t...
PAC-Bayesian Generalization Error Bounds for Gaussian Process Classification
Approximate Bayesian Gaussian process (GP) classification techniques are powerful nonparametric learning methods, similar in appearance and performance to Support Vector Machines. Based on simple probabilistic models, they render interpretable results and can be embedded in Bayesian frameworks for model selection, feature selection, etc. In this paper, by applying the PAC-Bayesian theorem of nc...
Data Dependent Priors in PAC-Bayes Bounds
One of the central aims of Statistical Learning Theory is the bounding of the test set performance of classifiers trained with i.i.d. data. For Support Vector Machines the tightest technique for assessing this so-called generalisation error is known as the PAC-Bayes theorem. The bound holds independently of the choice of prior, but better priors lead to sharper bounds. The priors leading to the...
PAC-Bayesian Generalization Bound on Confusion Matrix for Multi-Class Classification
In this work, we propose a PAC-Bayes bound for the generalization risk of the Gibbs classifier in the multi-class classification framework. The novelty of our work is the critical use of the confusion matrix of a classifier as an error measure; this puts our contribution in the line of work aiming at dealing with performance measures that are richer than a mere scalar criterion such as the misclas...
Chromatic PAC-Bayes Bounds for Non-IID Data: Applications to Ranking and Stationary β-Mixing Processes
PAC-Bayes bounds are among the most accurate generalization bounds for classifiers learned from independently and identically distributed (IID) data, and it is particularly so for margin classifiers: there have been recent contributions showing how practical these bounds can be either to perform model selection (Ambroladze et al., 2007) or even to directly guide the learning of linear classifie...